
1. Identity statement
Reference Type: Conference Paper (Conference Proceedings)
Site: sibgrapi.sid.inpe.br
Holder Code: ibi 8JMKD3MGPEW34M/46T9EHH
Identifier: 8JMKD3MGPEW34M/45CUSQ5
Repository: sid.inpe.br/sibgrapi/2021/09.06.22.26
Last Update: 2021:09.06.22.26.16 (UTC) administrator
Metadata Repository: sid.inpe.br/sibgrapi/2021/09.06.22.26.16
Metadata Last Update: 2022:06.14.00.00.32 (UTC) administrator
DOI: 10.1109/SIBGRAPI54419.2021.00018
Citation Key: EscherDrewBem:2021:FaSpTr
Title: Fast Spatial-Temporal Transformer Network
Format: On-line
Year: 2021
Access Date: 2024, May 06
Number of Files: 1
Size: 10314 KiB
2. Context
Author: 1 Escher, Rafael Molossi
2 Drews-Jr, Paulo
3 Bem, Rodrigo Andrade de
Affiliation: 1 Federal University of Rio Grande
2 Federal University of Rio Grande
3 Federal University of Rio Grande
Editor: Paiva, Afonso
Menotti, David
Baranoski, Gladimir V. G.
Proença, Hugo Pedro
Junior, Antonio Lopes Apolinario
Papa, João Paulo
Pagliosa, Paulo
dos Santos, Thiago Oliveira
e Sá, Asla Medeiros
da Silveira, Thiago Lopes Trugillo
Brazil, Emilio Vital
Ponti, Moacir A.
Fernandes, Leandro A. F.
Avila, Sandra
e-Mail Address: rafael-escher@hotmail.com
Conference Name: Conference on Graphics, Patterns and Images, 34 (SIBGRAPI)
Conference Location: Gramado, RS, Brazil (virtual)
Date: 18-22 Oct. 2021
Publisher: IEEE Computer Society
Publisher City: Los Alamitos
Book Title: Proceedings
Tertiary Type: Full Paper
History (UTC): 2021-09-06 22:26:16 :: rafael-escher@hotmail.com -> administrator ::
2022-03-02 00:54:16 :: administrator -> menottid@gmail.com :: 2021
2022-03-02 13:24:39 :: menottid@gmail.com -> administrator :: 2021
2022-06-14 00:00:32 :: administrator -> :: 2021
3. Content and structure
Is the master or a copy?: is the master
Content Stage: completed
Transferable: 1
Version Type: finaldraft
Keywords: Deep Learning
Video Inpainting
Reformer Networks
Transformer Networks
Abstract: In computer vision, the restoration of missing regions in an image can be tackled with image inpainting techniques. Neural networks that perform inpainting in videos require the extraction of information from neighboring frames to obtain a temporally coherent result. The state-of-the-art methods for video inpainting are mainly based on Transformer Networks, which rely on attention mechanisms to handle temporal input data. However, such networks are highly costly, requiring considerable computational power for training and testing, which hinders their use on modest computing platforms. In this context, our goal is to reduce the computational complexity of state-of-the-art video inpainting methods, improving performance and facilitating their use on low-end GPUs. Therefore, we introduce the Fast Spatio-Temporal Transformer Network (FastSTTN), an extension of the Spatio-Temporal Transformer Network (STTN) in which the adoption of Reversible Layers reduces memory usage up to 7 times and execution time by approximately 2.2 times, while maintaining state-of-the-art video inpainting accuracy.
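Note: as an illustration only (not the authors' code), the memory saving described in the abstract comes from RevNet-style reversible layers, whose inputs can be recomputed from their outputs, so intermediate activations need not be cached for backpropagation. A minimal PyTorch sketch, with hypothetical placeholder sub-networks F and G and an assumed half-split of the channel dimension:

    import torch
    import torch.nn as nn

    class ReversibleBlock(nn.Module):
        # Forward: y1 = x1 + F(x2); y2 = x2 + G(y1).
        # Inverse recovers (x1, x2) from (y1, y2), so activations need not be stored.
        def __init__(self, f: nn.Module, g: nn.Module):
            super().__init__()
            self.f, self.g = f, g

        def forward(self, x1, x2):
            y1 = x1 + self.f(x2)
            y2 = x2 + self.g(y1)
            return y1, y2

        def inverse(self, y1, y2):
            x2 = y2 - self.g(y1)
            x1 = y1 - self.f(x2)
            return x1, x2

    # Toy check with placeholder MLPs (assumption, not the FastSTTN architecture):
    d = 64
    mlp = lambda: nn.Sequential(nn.Linear(d, d), nn.ReLU(), nn.Linear(d, d))
    block = ReversibleBlock(mlp(), mlp())
    x1, x2 = torch.randn(8, d), torch.randn(8, d)
    y1, y2 = block(x1, x2)
    r1, r2 = block.inverse(y1, y2)
    print(torch.allclose(r1, x1, atol=1e-5), torch.allclose(r2, x2, atol=1e-5))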
Arrangement 1: urlib.net > SDLA > Fonds > SIBGRAPI 2021 > Fast Spatial-Temporal Transformer...
Arrangement 2: urlib.net > SDLA > Fonds > Full Index > Fast Spatial-Temporal Transformer...
doc Directory Content: access
source Directory Content: there are no files
agreement Directory Content: agreement.html 06/09/2021 19:26 1.3 KiB
4. Conditions of access and use
data URL: http://urlib.net/ibi/8JMKD3MGPEW34M/45CUSQ5
zipped data URL: http://urlib.net/zip/8JMKD3MGPEW34M/45CUSQ5
Language: en
Target File: FastSTTN___SIBGRAPI_2021.pdf
User Group: rafael-escher@hotmail.com
Visibility: shown
Update Permission: not transferred
5. Allied materials
Mirror Repository: sid.inpe.br/banon/2001/03.30.15.38.24
Next Higher Units: 8JMKD3MGPEW34M/45PQ3RS
8JMKD3MGPEW34M/4742MCS
Citing Item List: sid.inpe.br/sibgrapi/2021/11.12.11.46 5
Host Collection: sid.inpe.br/banon/2001/03.30.15.38
6. Notes
Empty Fields: archivingpolicy archivist area callnumber contenttype copyholder copyright creatorhistory descriptionlevel dissemination edition electronicmailaddress group isbn issn label lineage mark nextedition notes numberofvolumes orcid organization pages parameterlist parentrepositories previousedition previouslowerunit progress project readergroup readpermission resumeid rightsholder schedulinginformation secondarydate secondarykey secondarymark secondarytype serieseditor session shorttitle sponsor subject tertiarymark type url volume